Transition-based Dependency Parsing Using Two Heterogeneous Gated Recursive Neural Networks

Authors

  • Xinchi Chen
  • Yaqian Zhou
  • Chenxi Zhu
  • Xipeng Qiu
  • Xuanjing Huang

Abstract

Recently, neural network based dependency parsing has attracted much interest; it can effectively alleviate the problems of data sparsity and feature engineering by using dense features. However, it remains a challenging problem to sufficiently model the complicated syntactic and semantic compositions of these dense features in neural network based methods. In this paper, we propose two heterogeneous gated recursive neural networks: a tree-structured gated recursive neural network (Tree-GRNN) and a directed-acyclic-graph-structured gated recursive neural network (DAG-GRNN). We then integrate them to automatically learn the compositions of the dense features for transition-based dependency parsing. Specifically, Tree-GRNN models the feature combinations for the trees in the stack, which already have partial dependency structures, while DAG-GRNN models the feature combinations of the nodes whose dependency relations have not yet been built. Experimental results on two prevalent benchmark datasets (PTB3 and CTB5) show the effectiveness of the proposed model.
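To make the idea of gated feature composition more concrete, the sketch below shows a GRU-style gated composition applied bottom-up over a partial dependency subtree on the stack, roughly in the spirit of Tree-GRNN. The function names (gated_compose, tree_repr), the weight matrices W_r, W_z, W_h, and the specific gating equations are illustrative assumptions, not the exact formulation from the paper.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def gated_compose(head, child, W_r, W_z, W_h):
        # GRU-style gates decide how much of the head's current representation
        # to keep and how much of the new head+child composition to mix in.
        # NOTE: illustrative gating equations, not the paper's exact ones.
        x = np.concatenate([head, child])
        r = sigmoid(W_r @ x)                      # reset gate
        z = sigmoid(W_z @ x)                      # update gate
        h_new = np.tanh(W_h @ np.concatenate([r * head, child]))
        return z * head + (1.0 - z) * h_new

    def tree_repr(node, children, word_vecs, W_r, W_z, W_h):
        # Bottom-up composition over a partial dependency subtree already built
        # on the stack: a node's dense representation is its word vector
        # composed with the representation of each attached dependent.
        h = word_vecs[node]
        for child in children.get(node, []):
            h = gated_compose(h, tree_repr(child, children, word_vecs, W_r, W_z, W_h),
                              W_r, W_z, W_h)
        return h

    # Tiny usage example with random vectors: word 0 heads words 1 and 2.
    d = 4
    rng = np.random.default_rng(0)
    word_vecs = rng.normal(size=(3, d))
    W_r, W_z, W_h = (rng.normal(size=(d, 2 * d)) for _ in range(3))
    print(tree_repr(0, {0: [1, 2]}, word_vecs, W_r, W_z, W_h))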

Similar articles

Transition-based Dependency Parsing Using Recursive Neural Networks

In this work, we present a general compositional vector framework for transition-based dependency parsing. The ability to use transition-based algorithms allows for the application of vector composition to a large set of languages where only dependency treebanks are available, as well as handling linguistic phenomena such as non-projectivity, which pose problems for previously proposed methods....
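For readers unfamiliar with the transition-based setting referred to here and in the main paper, the sketch below shows a standard unlabeled arc-standard transition system (SHIFT / LEFT-ARC / RIGHT-ARC). The cited works may use different or extended transition sets, and the oracle callback standing in for a trained classifier is a hypothetical name.

    def arc_standard_parse(n_words, oracle):
        # Unlabeled arc-standard parsing; word index 0 is a virtual ROOT.
        # `oracle` maps a parser state to "SHIFT", "LEFT_ARC", or "RIGHT_ARC";
        # in a neural parser it would be a classifier over dense stack/buffer features.
        stack, buffer, arcs = [0], list(range(1, n_words + 1)), []
        while buffer or len(stack) > 1:
            action = oracle(stack, buffer, arcs)
            if action == "SHIFT" and buffer:
                stack.append(buffer.pop(0))
            elif action == "LEFT_ARC" and len(stack) >= 2:
                dependent = stack.pop(-2)          # second-topmost gets the top as head
                arcs.append((stack[-1], dependent))
            elif action == "RIGHT_ARC" and len(stack) >= 2:
                dependent = stack.pop()            # topmost gets the new top as head
                arcs.append((stack[-1], dependent))
            else:
                break                              # no valid action left: stop
        return arcs                                # list of (head, dependent) pairs

    # Toy usage: a fixed action sequence parsing a 2-word sentence.
    actions = iter(["SHIFT", "SHIFT", "LEFT_ARC", "RIGHT_ARC"])
    print(arc_standard_parse(2, lambda s, b, a: next(actions)))
    # -> [(2, 1), (0, 2)]: word 2 heads word 1, ROOT heads word 2.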

Second Exam: Natural Language Parsing with Neural Networks

With the advent of “deep learning”, there has been a recent resurgence of interest in the use of artificial neural networks for machine learning. This paper presents an overview of recent research in the statistical parsing of natural language sentences using such neural networks as a learning model. Though it is a fairly new addition to the toolset in this area, important results have been rec...

Efficient Structured Inference for Transition-Based Parsing with Neural Networks and Error States

Transition-based approaches based on local classification are attractive for dependency parsing due to their simplicity and speed, despite producing results slightly below the state-of-the-art. In this paper, we propose a new approach for approximate structured inference for transition-based parsing that produces scores suitable for global scoring using local models. This is accomplished with t...

An improved joint model: POS tagging and dependency parsing

Dependency parsing is a form of syntactic parsing of natural language that automatically analyzes the dependency structure of sentences, producing a dependency graph for each input sentence. Part-of-speech (POS) tagging is a prerequisite for dependency parsing. Generally, dependency parsers perform the POS tagging task together with dependency parsing in a pipeline mode. Unfortunately, in pipel...
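As a concrete illustration of the output described above, here is a toy POS tag sequence and dependency graph; the example sentence and label names are purely illustrative.

    sentence = ["She", "reads", "books"]

    # Pipeline mode: POS tagging is done first, then the parser consumes the tags.
    pos_tags = ["PRP", "VBZ", "NNS"]                 # one tag per word

    # The parse is a dependency graph: one (head, dependent, relation) arc per word,
    # with index 0 standing for a virtual ROOT node.
    dependency_graph = [
        (2, 1, "nsubj"),    # "She"   depends on "reads"
        (0, 2, "root"),     # "reads" depends on ROOT
        (2, 3, "dobj"),     # "books" depends on "reads"
    ]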

Parsing with Lexicalized Probabilistic Recursive Transition Networks

We present a formalization of lexicalized Recursive Transition Networks which we call Automaton-Based Generative Dependency Grammar (gdg). We show how to extract a gdg from a syntactically annotated corpus, present a chart parser for gdg, and discuss different probabilistic models which are directly implemented in the finite automata and do not affect the parser.

Publication date: 2015